Enhance conversation handling: clone entries to avoid mutation, extract text from parts, and preserve new user prompts in conversation history #1631
Pull request overview
This PR improves conversation handling in the Gemini AI integration by preventing mutation of the original conversation history and by extracting only text content from message parts. The changes address how conversation entries are cloned, how text is filtered out of multi-part messages, and how new user prompts are preserved when function calls occur.
Changes:
- Clone conversation entries before processing to avoid mutating the original data
- Filter message parts to extract only text content, skipping function calls and responses
- Preserve new user prompts separately to ensure they're correctly added to conversation history during function call handling
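The clone-then-filter approach described above can be sketched as follows. This is a minimal illustration, not the PR's actual code: `isDef` is a stand-in for OpenAF's definedness helper, and the entry shape (`{ role, parts: [...] }`) is an assumption based on the Gemini message format.

```javascript
// Minimal sketch, assuming entries shaped like
// { role, parts: [{ text?, functionCall?, functionResponse? }] }.
// isDef is a stand-in for OpenAF's "is defined" helper.
var isDef = v => typeof v !== "undefined" && v !== null

function extractTextEntries(history) {
  return history.map(entry => {
    // Clone so the original conversation history is never mutated
    var clone = Object.assign({}, entry, { parts: entry.parts.slice() })
    // Keep only parts carrying text; skip functionCall / functionResponse parts
    var textParts = clone.parts.filter(p => isDef(p) && isDef(p.text))
    if (textParts.length > 0) {
      clone.content = textParts.map(p => p.text).join("\n")
    }
    return clone
  })
}

var history = [
  { role: "user", parts: [{ text: "hi" }, { functionCall: { name: "f" } }] }
]
var out = extractTextEntries(history)
console.log(out[0].content)            // "hi" -- function call part skipped
console.log(isDef(history[0].content)) // false -- original entry untouched
```

Because the clone carries its own copy of `parts`, later mutations of the processed entries cannot leak back into the caller's history.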
```js
// Only extract text from parts that have text (skip functionCall, functionResponse, etc.)
var textParts = _entry.parts.filter(p => isDef(p) && isDef(p.text))
if (textParts.length > 0) {
    _entry.content = textParts.reduce((aC, aV) => aC + "\n" + aV.text, "")
```
**Copilot AI** commented on Jan 16, 2026:
The reduce operation always prepends a newline, resulting in leading whitespace when concatenating text parts. Initialize the accumulator with the first element's text or use join('\n') instead.
Suggested change:

```diff
- _entry.content = textParts.reduce((aC, aV) => aC + "\n" + aV.text, "")
+ _entry.content = textParts.map(p => p.text).join("\n")
```
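The difference the suggestion addresses can be reproduced in isolation: seeding `reduce` with an empty-string accumulator prepends the separator before the first element, while `join` only inserts separators between elements.

```javascript
var textParts = [{ text: "a" }, { text: "b" }]

// Original: the accumulator starts at "", so every element prepends "\n",
// leaving a stray leading newline on the result
var viaReduce = textParts.reduce((aC, aV) => aC + "\n" + aV.text, "")
console.log(JSON.stringify(viaReduce)) // "\na\nb"

// Suggested fix: join inserts separators only between elements
var viaJoin = textParts.map(p => p.text).join("\n")
console.log(JSON.stringify(viaJoin))   // "a\nb"
```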
```diff
- return __r.candidates[0].content.parts.reduce((aC, aV) => aC + "\n" + aV.text, "")
+ // Only extract text from parts that have text (skip functionCall, functionResponse, etc.)
+ var textParts = __r.candidates[0].content.parts.filter(p => isDef(p) && isDef(p.text))
+ return textParts.reduce((aC, aV) => aC + "\n" + aV.text, "")
```
**Copilot AI** commented on Jan 16, 2026:
The reduce operation always prepends a newline, resulting in leading whitespace when concatenating text parts. Initialize the accumulator with the first element's text or use join('\n') instead.
This pull request refines how conversation messages and model responses are processed in js/owrap.ai.js to improve handling of message parts and ensure correct conversation history management. The main changes focus on extracting only relevant text from message parts, preventing mutation of original data, and correctly appending new user prompts and model responses to the conversation.

Conversation message handling improvements:
- Updated the getConversation method to clone conversation entries before processing, preventing mutation of the original conversation history. It now extracts only the text from message parts, skipping non-text parts such as function calls or responses. ([js/owrap.ai.js L519-R537](https://github.com/OpenAF/openaf/pull/1631/files#diff-184d2cbefa7d7f91b25bf24fe2d80ee95cbe8ed9824a9c6582bfb72d4221d165L519-R537), [[1]](https://github.com/OpenAF/openaf/pull/1631/files#diff-184d2cbefa7d7f91b25bf24fe2d80ee95cbe8ed9824a9c6582bfb72d4221d165L579-R590), [[2]](https://github.com/OpenAF/openaf/pull/1631/files#diff-184d2cbefa7d7f91b25bf24fe2d80ee95cbe8ed9824a9c6582bfb72d4221d165L625-R638))

Conversation history management:
- See the related diff hunks at [js/owrap.ai.js R667-R675](https://github.com/OpenAF/openaf/pull/1631/files#diff-184d2cbefa7d7f91b25bf24fe2d80ee95cbe8ed9824a9c6582bfb72d4221d165R667-R675) and [js/owrap.ai.js L767-R790](https://github.com/OpenAF/openaf/pull/1631/files#diff-184d2cbefa7d7f91b25bf24fe2d80ee95cbe8ed9824a9c6582bfb72d4221d165L767-R790).
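The "preserve new user prompts" behavior can be illustrated with a hedged sketch. The names (`runTurn`, `callModel`, `newUserPrompt`) and the loop shape are assumptions for illustration, not the PR's exact code; the point is that the prompt is appended exactly once even when function-call round trips re-enter the model.

```javascript
// Hypothetical sketch: append the new user prompt once, then loop on
// function calls without re-adding it, working on a clone of the history.
function runTurn(history, newUserPrompt, callModel) {
  // Work on a clone; never mutate the caller's history
  var conv = history.slice()
  conv.push({ role: "user", parts: [{ text: newUserPrompt }] })

  var response = callModel(conv)
  while (response.functionCall) {
    // Append the function result and ask the model again;
    // the user prompt above is already in conv and is not re-added
    conv.push({ role: "tool", parts: [{ functionResponse: response.functionCall.name + ":ok" }] })
    response = callModel(conv)
  }
  conv.push({ role: "model", parts: [{ text: response.text }] })
  return conv
}

// Fake model: first answers with a function call, then with text
var calls = 0
var fakeModel = conv => (calls++ === 0)
  ? { functionCall: { name: "getTime" } }
  : { text: "done" }

var history = []
var out = runTurn(history, "what time is it?", fakeModel)
console.log(history.length) // 0 -- original history untouched
console.log(out.length)     // 3 -- user prompt, tool response, model answer
```

Keeping the prompt out of the shared history until the turn completes is what lets the function-call loop run without duplicating or dropping the user's message.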